pip install tensorflow
Collecting tensorflow
  Downloading tensorflow-2.8.0-cp38-cp38-win_amd64.whl (438.0 MB)
...
Successfully installed absl-py-1.0.0 astunparse-1.6.3 cachetools-5.0.0 charset-normalizer-2.0.12 flatbuffers-2.0 gast-0.5.3 google-auth-2.6.2 google-auth-oauthlib-0.4.6 google-pasta-0.2.0 grpcio-1.44.0 h5py-3.6.0 idna-3.3 keras-2.8.0 keras-preprocessing-1.1.2 libclang-13.0.0 markdown-3.3.6 numpy-1.22.3 oauthlib-3.2.0 opt-einsum-3.3.0 protobuf-3.20.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 requests-2.27.1 requests-oauthlib-1.3.1 rsa-4.8 tensorboard-2.8.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.1 tensorflow-2.8.0 tensorflow-io-gcs-filesystem-0.24.0 termcolor-1.1.0 tf-estimator-nightly-2.8.0.dev2021122109 urllib3-1.26.9 werkzeug-2.1.1 wrapt-1.14.0
Note: you may need to restart the kernel to use updated packages.
pip install pandas
Collecting pandas
  Downloading pandas-1.4.2-cp38-cp38-win_amd64.whl (10.6 MB)
Collecting pytz>=2020.1
  Downloading pytz-2022.1-py2.py3-none-any.whl (503 kB)
Installing collected packages: pytz, pandas
Successfully installed pandas-1.4.2 pytz-2022.1
Note: you may need to restart the kernel to use updated packages.
pip install pandas_datareader
Collecting pandas_datareader
  Downloading pandas_datareader-0.10.0-py3-none-any.whl (109 kB)
Collecting lxml
  Downloading lxml-4.8.0-cp38-cp38-win_amd64.whl (3.6 MB)
...
Installing collected packages: lxml, pandas-datareader
Successfully installed lxml-4.8.0 pandas-datareader-0.10.0
Note: you may need to restart the kernel to use updated packages.
pip install plotly
Collecting plotly
  Downloading plotly-5.6.0-py2.py3-none-any.whl (27.7 MB)
Collecting tenacity>=6.2.0
  Downloading tenacity-8.0.1-py3-none-any.whl (24 kB)
Installing collected packages: tenacity, plotly
Successfully installed plotly-5.6.0 tenacity-8.0.1
Note: you may need to restart the kernel to use updated packages.
pip install sklearn
Collecting sklearn
  Downloading sklearn-0.0.tar.gz (1.1 kB)
Collecting scikit-learn
  Downloading scikit_learn-1.0.2-cp38-cp38-win_amd64.whl (7.2 MB)
...
Successfully installed joblib-1.1.0 scikit-learn-1.0.2 scipy-1.8.0 sklearn-0.0 threadpoolctl-3.1.0
Note: you may need to restart the kernel to use updated packages.
import numpy as np
import pandas as pd
import os
import matplotlib.pyplot as plt
import pandas_datareader as web
import datetime as dt
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, LSTM
from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping
import matplotlib.dates as dates
import plotly.express as px
import plotly.graph_objects as go
from plotly.subplots import make_subplots
pip install matplotlib
Collecting matplotlib
  Downloading matplotlib-3.5.1-cp38-cp38-win_amd64.whl (7.2 MB)
...
Installing collected packages: pillow, kiwisolver, fonttools, cycler, matplotlib
Successfully installed cycler-0.11.0 fonttools-4.31.2 kiwisolver-1.4.2 matplotlib-3.5.1 pillow-9.1.0
Note: you may need to restart the kernel to use updated packages.
data=pd.read_csv('upload_DJIA_table.csv')
data[0:3]
| | Date | Open | High | Low | Close | Volume | Adj Close |
|---|---|---|---|---|---|---|---|
| 0 | 2016-07-01 | 17924.240234 | 18002.380859 | 17916.910156 | 17949.369141 | 82160000 | 17949.369141 |
| 1 | 2016-06-30 | 17712.759766 | 17930.609375 | 17711.800781 | 17929.990234 | 133030000 | 17929.990234 |
| 2 | 2016-06-29 | 17456.019531 | 17704.509766 | 17456.019531 | 17694.679688 | 106380000 | 17694.679688 |
data['Date']=pd.to_datetime(data['Date'])
data=data.sort_values('Date')
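The sort matters because the CSV stores rows newest-first (2016-07-01 is row 0). A minimal pandas sketch, using toy rows copied from the table above, shows the reordering:

```python
import pandas as pd

# Three rows as they appear in the CSV: newest first.
df = pd.DataFrame({
    "Date": ["2016-07-01", "2016-06-30", "2016-06-29"],
    "Open": [17924.24, 17712.76, 17456.02],
})
df["Date"] = pd.to_datetime(df["Date"])
df = df.sort_values("Date")  # re-sort chronologically before any windowing
print(df["Open"].tolist())   # [17456.02, 17712.76, 17924.24]
```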
fig=make_subplots(specs=[[{"secondary_y":False}]])
fig.add_trace(go.Scatter(x=data['Date'],y=data['Open'].rolling(window=7).mean(),name="DJIA"),secondary_y=False,)
fig.update_layout(autosize=False,width=900,height=500,title_text="DJIA")
fig.update_xaxes(title_text="year")
fig.update_yaxes(title_text="prices",secondary_y=False)
fig.show()
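The trace plotted above is a trailing 7-day average of the opening price. A small standalone sketch of how pandas computes `rolling(window=7).mean()` (toy series, not the DJIA column):

```python
import pandas as pd

s = pd.Series([1., 2., 3., 4., 5., 6., 7., 8.])
smoothed = s.rolling(window=7).mean()  # NaN until 7 values are available
print(smoothed.tolist())
```

The first six entries are NaN, then each value is the mean of the preceding seven observations, which is why the smoothed curve starts a week into the data.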
n=len(data)
train_data=data[(n//20)*17:(n//20)*19]
test_data=data[(n//20)*19:]
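The integer-division split above carves the file into twentieths. With n = 1989 rows (inferred from the lengths this notebook prints, not stated in the source), the arithmetic works out as:

```python
n = 1989      # inferred row count; the CSV itself is the authority
k = n // 20   # one "twentieth" of the dataset (99 rows here)

train_rows = 19 * k - 17 * k  # rows 17k..19k  -> 198
test_rows = n - 19 * k        # rows 19k..end  -> 108
print(train_rows, test_rows)  # 198 108
```

Note that rows before index 17·k, roughly the first 85% of the file, are not used by either set under this split.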
fig=make_subplots(specs=[[{"secondary_y":False}]])
fig.add_trace(go.Scatter(x=train_data['Date'],y=train_data['Open'],name="Train"),secondary_y=False,)
fig.add_trace(go.Scatter(x=test_data['Date'],y=test_data['Open'],name="Test"),secondary_y=False,)
fig.update_layout(autosize=False,width=900,height=500,title_text="DJIA")
fig.update_xaxes(title_text="year")
fig.update_yaxes(title_text="prices",secondary_y=False)
fig.show()
test_data[0:3]
| | Date | Open | High | Low | Close | Volume | Adj Close |
|---|---|---|---|---|---|---|---|
| 107 | 2016-01-29 | 16090.259766 | 16466.300781 | 16090.259766 | 16466.300781 | 217940000 | 16466.300781 |
| 106 | 2016-02-01 | 16453.630859 | 16510.980469 | 16299.469727 | 16449.179688 | 114450000 | 16449.179688 |
| 105 | 2016-02-02 | 16420.210938 | 16420.210938 | 16108.440430 | 16153.540039 | 126210000 | 16153.540039 |
print(len(train_data))
print(len(test_data))
198
108
scaler = MinMaxScaler(feature_range=(0,1))
scaled_data = scaler.fit_transform(train_data['Open'].values.reshape(-1,1))
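`MinMaxScaler` with `feature_range=(0, 1)` maps each value via (x − min) / (max − min). A plain-numpy sketch of the transform and its inverse, on toy prices rather than the real column:

```python
import numpy as np

x = np.array([[16000.], [17000.], [18000.]])  # toy stand-in for the 'Open' column
lo, hi = x.min(), x.max()

scaled = (x - lo) / (hi - lo)       # what fit_transform computes -> values in [0, 1]
restored = scaled * (hi - lo) + lo  # what inverse_transform undoes
print(scaled.ravel())  # [0.  0.5 1. ]
```

Fitting the scaler on the training set only and reusing it on the test inputs (as done later with `scaler.transform`) is the leak-free pattern.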
prediction_days = 30
x_train = []
y_train = []
for x in range(prediction_days, len(scaled_data) - 5):
    x_train.append(scaled_data[x - prediction_days:x, 0])  # input: the previous 30 scaled prices
    y_train.append(scaled_data[x + 5, 0])                  # target: the price 5 days after the window ends
x_train, y_train = np.array(x_train), np.array(y_train)
x_train = np.reshape(x_train, (x_train.shape[0], x_train.shape[1], 1))
print(x_train.shape)
print(y_train.shape)
(163, 30, 1)
(163,)
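The windowing loop above can be checked on a toy series; the window and horizon are scaled down from 30 and 5 for readability, but the index arithmetic is the same:

```python
import numpy as np

series = np.arange(20, dtype=float)  # toy stand-in for scaled_data[:, 0]
window, horizon = 4, 5               # stand-ins for prediction_days and the 5-day offset

X, y = [], []
for i in range(window, len(series) - horizon):
    X.append(series[i - window:i])   # the previous `window` values
    y.append(series[i + horizon])    # the value `horizon` steps later

X = np.array(X).reshape(-1, window, 1)  # (samples, timesteps, features) as the LSTM expects
y = np.array(y)
print(X.shape, y.shape)  # (11, 4, 1) (11,)
```

Stopping the loop at `len(series) - horizon` keeps the last target index in range, which is what the `len(scaled_data) - 5` bound does above.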
def LSTM_model():
    model = Sequential()
    model.add(LSTM(units=50, return_sequences=True, input_shape=(x_train.shape[1], 1)))
    model.add(Dropout(0.2))
    model.add(LSTM(units=50, return_sequences=True))
    model.add(Dropout(0.2))
    model.add(LSTM(units=50))
    model.add(Dropout(0.2))
    model.add(Dense(units=1))
    return model
model = LSTM_model()
model.summary()
# 'accuracy' is a classification metric and is not meaningful for this regression target;
# the MSE loss is the number to watch in the training log
model.compile(optimizer='adam', loss='mean_squared_error', metrics=['accuracy'])
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm (LSTM) (None, 30, 50) 10400
dropout (Dropout) (None, 30, 50) 0
lstm_1 (LSTM) (None, 30, 50) 20200
dropout_1 (Dropout) (None, 30, 50) 0
lstm_2 (LSTM) (None, 50) 20200
dropout_2 (Dropout) (None, 50) 0
dense (Dense) (None, 1) 51
=================================================================
Total params: 50,851
Trainable params: 50,851
Non-trainable params: 0
_________________________________________________________________
# save_best_only=True needs a validation metric, but fit() below is given no validation data,
# so the callback warns "Can save best model only with val_loss available" each epoch and never
# saves; passing e.g. validation_split=0.1 to fit() would enable it
checkpointer = ModelCheckpoint(filepath='weights_best.hdf5', verbose=1, save_best_only=True)
his = model.fit(x_train, y_train, epochs=20, batch_size=32, callbacks=[checkpointer])
Epoch 1/20
6/6 [==============================] - 7s 54ms/step - loss: 0.1927 - accuracy: 0.0061
WARNING:tensorflow:Can save best model only with val_loss available, skipping.
Epoch 2/20
6/6 [==============================] - 0s 56ms/step - loss: 0.0740 - accuracy: 0.0000e+00
...
Epoch 10/20
6/6 [==============================] - 0s 56ms/step - loss: 0.0474 - accuracy: 0.0000e+00
...
Epoch 20/20
6/6 [==============================] - 0s 76ms/step - loss: 0.0483 - accuracy: 0.0061
plt.plot(his.history['loss'])
plt.plot(his.history['accuracy'])
plt.title('training loss and accuracy')
plt.ylabel('value')
plt.xlabel('epoch')
plt.legend(['loss','accuracy'], loc='upper right')
plt.show()
actual_prices = test_data['Open'].values
total_dataset = pd.concat((train_data['Open'], test_data['Open']), axis=0)
model_inputs = total_dataset[len(total_dataset)-len(test_data)-prediction_days:].values
model_inputs = model_inputs.reshape(-1,1)
model_inputs = scaler.transform(model_inputs)
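The `len(total_dataset) - len(test_data) - prediction_days` offset above keeps just enough training history to build a full input window for the first test day. A toy sketch of the arithmetic:

```python
import numpy as np

train = np.arange(10.)      # toy stand-in for train_data['Open']
test = np.arange(10., 14.)  # toy stand-in for test_data['Open']
window = 3                  # toy stand-in for prediction_days

total = np.concatenate([train, test])
# The first test day needs the `window` days before it as input,
# so the inputs start `window` rows before the test set begins.
inputs = total[len(total) - len(test) - window:]
print(len(inputs))  # 7  (len(test) + window)
```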
x_test = []
for x in range(prediction_days,len(model_inputs)):
x_test.append(model_inputs[x-prediction_days:x,0])
x_test = np.array(x_test)
x_test = np.reshape(x_test,(x_test.shape[0],x_test.shape[1],1))
predicted_prices = model.predict(x_test)
predicted_prices = scaler.inverse_transform(predicted_prices)
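With the predictions back on the price scale, a numeric error metric is more informative than the plot alone. A minimal numpy sketch of RMSE and MAE on toy values; on the real data the inputs would be `actual_prices` and `predicted_prices.ravel()`:

```python
import numpy as np

actual = np.array([16466.3, 16449.2, 16153.5])     # toy stand-in for actual_prices
predicted = np.array([16289.0, 16267.0, 16263.0])  # toy stand-in for the predictions

rmse = np.sqrt(np.mean((actual - predicted) ** 2))  # root-mean-squared error
mae = np.mean(np.abs(actual - predicted))           # mean absolute error
print(f"RMSE: {rmse:.1f}  MAE: {mae:.1f}")
```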
plt.plot(actual_prices, color='magenta', label=f"Actual price")
plt.plot(predicted_prices, color= 'green', label=f"Predicted 5-days-after price")
plt.title(f"DJIA Stock")
plt.xlabel("Days in test period")
plt.ylabel(f"Price")
plt.legend()
plt.show()
test_data = test_data.copy()  # explicit copy so the assignment below does not trigger pandas' SettingWithCopyWarning
test_data['predict'] = predicted_prices
test_data[0:10]
| | Date | Open | High | Low | Close | Volume | Adj Close | predict |
|---|---|---|---|---|---|---|---|---|
| 107 | 2016-01-29 | 16090.259766 | 16466.300781 | 16090.259766 | 16466.300781 | 217940000 | 16466.300781 | 16289.003906 |
| 106 | 2016-02-01 | 16453.630859 | 16510.980469 | 16299.469727 | 16449.179688 | 114450000 | 16449.179688 | 16266.957031 |
| 105 | 2016-02-02 | 16420.210938 | 16420.210938 | 16108.440430 | 16153.540039 | 126210000 | 16153.540039 | 16263.038086 |
| 104 | 2016-02-03 | 16186.200195 | 16381.690430 | 15960.450195 | 16336.660156 | 141870000 | 16336.660156 | 16276.891602 |
| 103 | 2016-02-04 | 16329.669922 | 16485.839844 | 16266.160156 | 16416.580078 | 131490000 | 16416.580078 | 16300.595703 |
| 102 | 2016-02-05 | 16417.949219 | 16423.630859 | 16129.809570 | 16204.969727 | 139010000 | 16204.969727 | 16331.568359 |
| 101 | 2016-02-08 | 16147.509766 | 16147.509766 | 15803.549805 | 16027.049805 | 165880000 | 16027.049805 | 16369.849609 |
| 100 | 2016-02-09 | 16005.410156 | 16136.620117 | 15881.110352 | 16014.379883 | 127740000 | 16014.379883 | 16408.748047 |
| 99 | 2016-02-10 | 16035.610352 | 16201.889648 | 15899.910156 | 15914.740234 | 122290000 | 15914.740234 | 16441.806641 |
| 98 | 2016-02-11 | 15897.820312 | 15897.820312 | 15503.009766 | 15660.179688 | 172070000 | 15660.179688 | 16466.044922 |
fig=make_subplots(specs=[[{"secondary_y":False}]])
fig.add_trace(go.Scatter(x=train_data['Date'],y=train_data['Open'],name="Train Actual"),secondary_y=False,)
fig.add_trace(go.Scatter(x=test_data['Date'],y=test_data['Open'],name="Test Actual"),secondary_y=False,)
fig.add_trace(go.Scatter(x=test_data['Date'],y=test_data['predict'],name="Predicted 5-days after price"),secondary_y=False,)
fig.update_layout(autosize=False,width=900,height=500,title_text="DJIA")
fig.update_xaxes(title_text="year")
fig.update_yaxes(title_text="prices",secondary_y=False)
fig.show()
real_data = [model_inputs[len(model_inputs) - prediction_days:, 0]]  # the last 30 inputs; the original `len(model_inputs)+1-prediction_days:len(model_inputs+1)` slice started one row late and produced only 29
real_data = np.array(real_data)
real_data = np.reshape(real_data, (real_data.shape[0], real_data.shape[1], 1))
print(real_data.shape)
(1, 30, 1)
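Slices of the form `a[len(a) + 1 - w:]` start one row late and return only w − 1 elements; a toy array makes the off-by-one visible:

```python
import numpy as np

a = np.arange(10).reshape(-1, 1)  # toy stand-in for model_inputs
w = 5                             # toy stand-in for prediction_days

short = a[len(a) + 1 - w:len(a), 0]  # starts one row late -> w - 1 values
full = a[len(a) - w:, 0]             # the last w values
print(short.shape, full.shape)  # (4,) (5,)
```

A window one timestep shorter than the model's `input_shape` would make `model.predict` fail, so the slice needs the `len(a) - w` form.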